Edge-Nodes Representation Neural Machine for Link Prediction

Authors
Abstract


Similar Articles

Link Prediction in Networks with Nodes Attributes by Similarity Propagation

The problem of link prediction has attracted considerable recent attention from various domains such as sociology, anthropology, information science, and computer science. A link prediction algorithm is proposed based on link similarity score propagation by a random walk in networks with node attributes. In the algorithm, each link in the network is assigned a transmission probability accordi...
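
The snippet above only names the technique, so here is a minimal, self-contained sketch of the general idea: each edge gets a transmission probability derived from the similarity of its endpoints' attribute vectors, and similarity scores are then propagated by a short random walk. The cosine weighting, the restart factor alpha, and the step count are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def attribute_transition_matrix(adj, attrs):
    """Build a row-stochastic transition matrix where each edge's
    transmission probability is proportional to the cosine similarity
    of its endpoints' attribute vectors (an illustrative choice)."""
    unit = attrs / (np.linalg.norm(attrs, axis=1, keepdims=True) + 1e-12)
    cos = unit @ unit.T                       # pairwise cosine similarity
    weights = adj * np.maximum(cos, 0.0)      # keep weights only on existing edges
    return weights / (weights.sum(axis=1, keepdims=True) + 1e-12)

def propagate_scores(adj, attrs, steps=3, alpha=0.85):
    """Propagate node-to-node similarity scores with a restart-style
    random walk; higher scores suggest more likely missing links."""
    P = attribute_transition_matrix(adj, attrs)
    n = adj.shape[0]
    S = np.eye(n)
    for _ in range(steps):
        S = alpha * (P @ S) + (1 - alpha) * np.eye(n)
    return S

# Toy example: 4 nodes with 2-dimensional attributes; score the non-edge (0, 3).
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 0],
                [0, 1, 0, 0]], dtype=float)
attrs = np.array([[1.0, 0.2], [0.9, 0.3], [0.8, 0.1], [1.0, 0.4]])
scores = propagate_scores(adj, attrs)
print("score for candidate link (0, 3):", scores[0, 3])
```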


Link Prediction Based on Graph Neural Networks

Traditional methods for link prediction can be categorized into three main types: graph structure feature-based, latent feature-based, and explicit feature-based. Graph structure feature methods leverage some handcrafted node proximity scores, e.g., common neighbors, to estimate the likelihood of links. Latent feature methods rely on factorizing networks’ matrix representations to learn an embe...
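
The common-neighbors score mentioned above is easy to make concrete; the sketch below also includes the closely related Adamic-Adar score. The toy graph and the candidate pair are illustrative and not taken from the paper.

```python
import math

# Small undirected toy graph stored as an adjacency dictionary of neighbor sets.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4)]
neighbors = {}
for u, v in edges:
    neighbors.setdefault(u, set()).add(v)
    neighbors.setdefault(v, set()).add(u)

def common_neighbors(u, v):
    """Number of shared neighbors: the classic handcrafted proximity score."""
    return len(neighbors[u] & neighbors[v])

def adamic_adar(u, v):
    """Shared neighbors weighted by the inverse log of their degree."""
    return sum(1.0 / math.log(len(neighbors[w]))
               for w in neighbors[u] & neighbors[v]
               if len(neighbors[w]) > 1)

candidate = (0, 3)  # a non-edge to score
print("common neighbors:", common_neighbors(*candidate))
print("Adamic-Adar:", adamic_adar(*candidate))
```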


Neural Machine Translation with Source Dependency Representation

Source dependency information has been successfully introduced into statistical machine translation. However, there are only a few preliminary attempts for Neural Machine Translation (NMT), such as concatenating the representations of a source word and its dependency label. In this paper, we propose a novel attentional NMT with source dependency representation to improve translation performa...
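
As a rough illustration of the concatenation baseline the abstract mentions (joining a source word's embedding with the embedding of its dependency label before the encoder), a minimal PyTorch sketch follows. The layer sizes, the bidirectional GRU encoder, and the DepAwareEncoder name are assumptions for illustration; the paper's proposed attentional model is not reproduced here.

```python
import torch
import torch.nn as nn

class DepAwareEncoder(nn.Module):
    """Concatenate a word embedding with the embedding of its dependency
    label before the recurrent encoder (sizes are illustrative)."""
    def __init__(self, vocab_size, n_dep_labels, word_dim=256, dep_dim=32, hidden=512):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.dep_emb = nn.Embedding(n_dep_labels, dep_dim)
        self.rnn = nn.GRU(word_dim + dep_dim, hidden,
                          batch_first=True, bidirectional=True)

    def forward(self, word_ids, dep_ids):
        # Concatenate per-token word and dependency-label embeddings.
        x = torch.cat([self.word_emb(word_ids), self.dep_emb(dep_ids)], dim=-1)
        outputs, _ = self.rnn(x)   # per-token states that a decoder could attend over
        return outputs

# Toy usage: a batch of 2 source sentences of length 5.
enc = DepAwareEncoder(vocab_size=1000, n_dep_labels=40)
words = torch.randint(0, 1000, (2, 5))
deps = torch.randint(0, 40, (2, 5))
print(enc(words, deps).shape)  # torch.Size([2, 5, 1024])
```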


Neural Representation Learning in Linguistic Structured Prediction

Advances in neural network architectures and training algorithms have demonstrated the effectiveness of representation learning in natural language processing. This thesis stresses the importance of computationally modeling the structure in language, even when learning representations. We propose that explicit structure representations and learned distributed representations can be efficiently ...


Context-dependent word representation for neural machine translation

We first observe a potential weakness of continuous vector representations of symbols in neural machine translation. That is, the continuous vector representation, or a word embedding vector, of a symbol encodes multiple dimensions of similarity, equivalent to encoding more than one meaning of the word. This has the consequence that the encoder and decoder recurrent networks in neural machine t...



Journal

Journal title: Algorithms

سال: 2019

ISSN: 1999-4893

DOI: 10.3390/a12010012